robotic finger
Experimental Characterization of Fingertip Trajectory Following for a 3-DoF Series-Parallel Hybrid Robotic Finger
Baiata, Nicholas, Chakraborty, Nilanjan
Abstract-- Task-space control of robotic fingers is a critical enabler of dexterous manipulation, as manipulation objectives are most naturally specified in terms of fingertip motions and applied forces rather than individual joint angles. While task-space planning and control have been extensively studied for larger, arm-scale manipulators, demonstrations of precise task-space trajectory tracking in compact, multi-DoF robotic fingers remain scarce. In this paper, we present the physical prototyping and experimental characterization of a three-degree-of-freedom, linkage-driven, series-parallel robotic finger with analytic forward kinematics and a closed-form Jacobian. A resolved motion rate control (RMRC) scheme is implemented to achieve closed-loop task-space trajectory tracking. We experimentally evaluate fingertip tracking performance across a variety of trajectories, including straight lines, circles, and more complex curves, and report millimeter-level accuracy. To the best of our knowledge, this work provides one of the first systematic experimental demonstrations of precise task-space trajectory tracking in a linkage-driven robotic finger, thereby establishing a benchmark for future designs aimed at dexterous in-hand manipulation.
I. INTRODUCTION
Task-space control is a cornerstone of modern robotics because it allows specifying and executing motions directly in terms of end-effector positions and orientations, which are the quantities most relevant to manipulation tasks. In dexterous manipulation, we are rarely interested in individual joint angles; rather, we care about applying forces, displacements, and velocities at specific points on the fingertips or the grasped object.
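For intuition, here is a minimal sketch of one resolved motion rate control (RMRC) update step. The planar 3R kinematics, link lengths, gain, and time step below are illustrative assumptions, not the paper's actual series-parallel finger model or its closed-form Jacobian:

```python
import numpy as np

def fk_planar_3r(q, l=(0.05, 0.03, 0.02)):
    """Fingertip position of a planar 3R chain (placeholder kinematics)."""
    a = np.cumsum(q)  # absolute link angles
    x = sum(li * np.cos(ai) for li, ai in zip(l, a))
    y = sum(li * np.sin(ai) for li, ai in zip(l, a))
    return np.array([x, y])

def jacobian_planar_3r(q, l=(0.05, 0.03, 0.02)):
    """Closed-form 2x3 Jacobian of the placeholder chain."""
    a = np.cumsum(q)
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -sum(l[j] * np.sin(a[j]) for j in range(i, 3))
        J[1, i] =  sum(l[j] * np.cos(a[j]) for j in range(i, 3))
    return J

def rmrc_step(q, x_des, xdot_des, K=20.0, dt=0.005):
    """One RMRC update: qdot = J^+ (xdot_des + K * position error)."""
    e = x_des - fk_planar_3r(q)
    qdot = np.linalg.pinv(jacobian_planar_3r(q)) @ (xdot_des + K * e)
    return q + qdot * dt
```

Iterating `rmrc_step` along a sampled task-space trajectory drives the fingertip error toward zero as long as the chain stays away from singularities; the proportional term corrects drift that pure velocity feedforward would accumulate.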
Noninvasive brain tech and AI moves robotic hand with thought
Thanks to a team at the University of California, Davis, a new brain-computer interface (BCI) system is opening up real-time, natural conversation for people who can't speak. Noninvasive brain tech is also transforming how people interact with robotic devices: instead of relying on muscle movement, this technology allows a person to control a robotic hand by simply thinking about moving their fingers. No implant is required; instead, a set of sensors placed on the scalp detects brain signals, which are then sent to a computer.
Soft Vision-Based Tactile-Enabled SixthFinger: Advancing Daily Objects Manipulation for Stroke Survivors
Hasanen, Basma, Mohsan, Mashood M., Alkayas, Abdulaziz Y., Renda, Federico, Hussain, Irfan
The presence of post-stroke grasping deficiencies highlights the critical need for the development and implementation of advanced compensatory strategies. This paper introduces a novel system to aid chronic stroke survivors through the development of a soft, vision-based, tactile-enabled extra robotic finger. By incorporating vision-based tactile sensing, the system autonomously adjusts grip force in response to slippage detection. This synergy not only ensures mechanical stability but also enriches tactile feedback, mimicking the dynamics of human-object interactions. At the core of our approach is a transformer-based framework trained on a comprehensive tactile dataset encompassing objects with a wide range of morphological properties, including variations in shape, size, weight, texture, and hardness. Furthermore, we validated the system's robustness in real-world applications, where it successfully manipulated various everyday objects. The promising results highlight the potential of this approach to improve the quality of life for stroke survivors.
3D Printable Gradient Lattice Design for Multi-Stiffness Robotic Fingers
Schouten, Siebe J., Steenman, Tomas, File, Rens, Hartog, Merlijn Den, Sakes, Aimee, Della Santina, Cosimo, Lussenburg, Kirsten, Shahabi, Ebrahim
Human fingers achieve exceptional dexterity and adaptability by combining structures with varying stiffness levels, from soft tissues (low) to tendons and cartilage (medium) to bones (high). This paper explores the development of a robotic finger with similar multi-stiffness characteristics. Specifically, we propose using a lattice configuration, parameterized by voxel size and unit cell geometry, to optimize and achieve fine-tuned stiffness properties with high granularity. A significant advantage of this approach is the feasibility of 3D printing the designs in a single process, eliminating the need for manual assembly of elements with differing stiffness. Based on this method, we present a novel, human-like finger and a soft gripper. We integrate the latter with a rigid manipulator and demonstrate its effectiveness in pick-and-place tasks.
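For intuition on how lattice density can be tuned to hit a target stiffness, the sketch below uses the classical Gibson-Ashby scaling for cellular solids (E/E_s ≈ C(ρ/ρ_s)^n, with n ≈ 2 for open-cell lattices). This is a generic textbook relation, not the authors' specific voxel/unit-cell parameterization; the zone names and target stiffness ratios are made-up illustrations:

```python
def relative_modulus(relative_density, exponent=2.0, c=1.0):
    """Gibson-Ashby scaling: E/E_s = C * (rho/rho_s)^n for a lattice material."""
    if not 0.0 < relative_density <= 1.0:
        raise ValueError("relative density must be in (0, 1]")
    return c * relative_density ** exponent

def density_for_modulus(target_ratio, exponent=2.0, c=1.0):
    """Invert the scaling to pick a voxel's relative density for a target stiffness."""
    return (target_ratio / c) ** (1.0 / exponent)

# Map a three-zone finger (soft skin, medium tendon, stiff bone) to lattice densities.
targets = {"skin": 0.01, "tendon": 0.09, "bone": 0.64}   # hypothetical E/E_s ratios
densities = {zone: density_for_modulus(r) for zone, r in targets.items()}
```

Because stiffness scales roughly quadratically with density, a modest density gradient yields a wide stiffness range, which is what makes single-process printing of multi-stiffness parts attractive.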
Would YOU let a robot check your breasts for lumps? Ultra-sensitive robotic 'finger' could be used to diagnose cancer earlier
An ultra-sensitive robotic 'finger' that could help detect breast cancer is being developed by scientists. Experts have created a device with a sophisticated sense of touch that can take patient pulses and check for abnormal lumps. The technology could make it easier for doctors to detect diseases such as breast cancer early on, when they are more treatable. And it may also help patients feel at ease during physical examinations that can seem uncomfortable and invasive, the researchers said. While rigid robotic fingers already exist, experts have raised concerns that these devices might not be up to the delicate tasks required in a doctor's office setting.
Identification and validation of the dynamic model of a tendon-driven anthropomorphic finger
Li, Junnan, Chen, Lingyun, Ringwald, Johannes, Fortunic, Edmundo Pozo, Ganguly, Amartya, Haddadin, Sami
This study addresses the absence of an identification framework to quantify a comprehensive dynamic model of human and anthropomorphic tendon-driven fingers, which is necessary to investigate the physiological properties of human fingers and improve the control of robotic hands. First, a generalized dynamic model was formulated that takes into account the inherent properties of such a mechanical system, including rigid-body dynamics, the coupling matrix, joint viscoelasticity, and tendon friction. Then, we propose a methodology comprising a series of experiments for step-wise identification and validation of this dynamic model. Moreover, an experimental setup was designed and constructed that features actuation modules and peripheral sensors to facilitate the identification process. To verify the proposed methodology, a 3D-printed robotic finger based on the index finger design of the Dexmart hand was developed, and the proposed experiments were executed to identify and validate its dynamic model. This study could be extended to explore the identification of cadaver hands, aiming for a consistent dataset from a single cadaver specimen to improve the development of musculoskeletal hand models.
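As a toy example of the kind of step-wise parameter identification described above, the sketch below fits a deliberately simplified single-joint model (linear stiffness, viscous damping, Coulomb friction) by linear least squares. The actual paper identifies a far richer model, including the tendon coupling matrix; everything here is an illustrative assumption:

```python
import numpy as np

def identify_joint_params(theta, theta_dot, tau):
    """Least-squares fit of tau = K*theta + D*theta_dot + Tc*sign(theta_dot).

    theta, theta_dot, tau: 1-D arrays of joint angle, velocity, and measured
    torque from an identification experiment. Returns [K, D, Tc].
    """
    # Regressor matrix: each column multiplies one unknown parameter.
    A = np.column_stack([theta, theta_dot, np.sign(theta_dot)])
    params, *_ = np.linalg.lstsq(A, tau, rcond=None)
    return params
```

Because the model is linear in its parameters, a single batch regression over one sufficiently exciting trajectory recovers them; nonlinear effects such as velocity-dependent tendon friction would require the richer, step-wise treatment used in the paper.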
PINN-Ray: A Physics-Informed Neural Network to Model Soft Robotic Fin Ray Fingers
Wang, Xing, Dabrowski, Joel Janek, Pinskier, Josh, Liow, Lois, Viswanathan, Vinoth, Scalzo, Richard, Howard, David
Modelling complex deformation for soft robotics provides a guideline to understanding robot behaviour, leading to safe interaction with the environment. However, building a surrogate model with both high accuracy and fast inference can be challenging for soft robotics due to nonlinearities arising from complex geometry, large deformations, material nonlinearity, etc. The reality gap of surrogate models also prevents their further deployment in the soft robotics domain. In this study, we propose a physics-informed neural network (PINN), named PINN-Ray, to model complex deformation of a Fin Ray soft robotic gripper; it embeds the minimum potential energy principle from elastic mechanics, together with additional high-fidelity experimental data, into the loss function of the neural network for training. This method stands out for its generalisation to complex geometries and its robustness to data scarcity compared with other data-driven neural networks. Furthermore, it has been extensively evaluated on modelling the deformation of the Fin Ray finger under external actuation. PINN-Ray demonstrates improved accuracy compared with finite element modelling (FEM) after applying the data-assimilation scheme to treat the sim-to-real gap. Additionally, we introduce an automated framework to design and fabricate soft robotic fingers and to characterise their deformation by visual tracking, which provides a guideline for the rapid prototyping of soft robotics.
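The loss composition described above (a physics term from the minimum potential energy principle plus a data-assimilation term from experiments) can be sketched abstractly as follows. The function name, weighting, and the treatment of the physics residual as a precomputed array are illustrative assumptions, not the actual PINN-Ray implementation:

```python
import numpy as np

def pinn_ray_loss(u_pred, u_data, physics_residual, lam=1.0):
    """Composite PINN training loss.

    u_pred:           predicted displacements at measurement points
    u_data:           experimentally measured displacements (data assimilation)
    physics_residual: residual of the physics condition (e.g., stationarity of
                      potential energy) evaluated at collocation points
    lam:              weight trading off data fit against physics consistency
    """
    loss_physics = np.mean(physics_residual ** 2)
    loss_data = np.mean((u_pred - u_data) ** 2)
    return loss_physics + lam * loss_data
```

In an actual PINN, `physics_residual` would be computed by automatic differentiation of the network output through the energy functional; setting `lam = 0` recovers a purely physics-driven model, which is what makes the method usable under data scarcity.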
DeepMind is experimenting with a nearly indestructible robot hand
A new robot hand provides extremely fast and flexible finger movements, while also being tough enough to survive intense damage. That durability helps the hand, which is already being used in Google DeepMind's robotics experiments, during the trial-and-error learning required to train artificial intelligence. This latest robotic hand developed by the UK-based Shadow Robot Company can go from fully open to closed within 500 milliseconds and perform a fingertip pinch with up to 10 newtons of force. It can also withstand repeated punishment such as pistons punching the fingers from multiple angles or a person smashing the device with a hammer. The new hand's robust design is well suited for AI-powered robotics experiments based on reinforcement learning, which allows robots to gradually learn how to interact with environments by fumbling through tasks using trial and error, says Ram Ramamoorthy at the University of Edinburgh in the UK.
The Multi-fingered Kinematic Model for Dual-arm Manipulation
A planar kinematic model in hand-object coordinates for bimanual manipulation is presented. It computes finger configurations: given desired fingertip positions as inputs, the model generates valid joint values for bimanual manipulation. Abstract-- This paper presents a planar finger kinematic model for a dual-arm robot to determine manipulation strategies. The first step is to model the planar geometric features of coordinated and rolling motion so that the robot can select finger configurations; for the hand-object model, the distances between the object and the hands are taken as constraints. The second step is to seek appropriate finger joint values from randomly generated position samples, which the robot selects according to the displacement of each joint and k-means clustering. The simulation shows that the selected solutions for the manipulation all lie within the finger workspace.
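The sampling-and-clustering selection step can be sketched as follows: cluster candidate joint-value samples with a plain k-means, then pick the cluster center requiring the smallest joint displacement from the current configuration. Function names and the deterministic initialization are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        # Distance from each sample to its nearest chosen center.
        d = np.min(np.linalg.norm(X[:, None] - np.array(centers)[None], axis=2), axis=1)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def select_configuration(samples, q_current, k=3):
    """Cluster candidate joint-value samples and return the cluster center
    with the smallest joint displacement from the current configuration."""
    centers = kmeans(np.asarray(samples, dtype=float), k)
    return centers[np.argmin(np.linalg.norm(centers - np.asarray(q_current), axis=1))]
```

Clustering collapses many redundant inverse-kinematics samples into a few representative configurations, and the displacement criterion then prefers the one reachable with the least joint motion.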
Robotic hand can identify objects with just one grasp
MIT researchers developed a soft-rigid robotic finger that incorporates powerful sensors along its entire length, enabling them to produce a robotic hand that could accurately identify objects after only one grasp. Inspired by the human finger, MIT researchers have developed a robotic hand that uses high-resolution touch sensing to accurately identify an object after grasping it just one time. Many robotic hands pack all their powerful sensors into the fingertips, so an object must be in full contact with those fingertips to be identified, which can take multiple grasps. Other designs use lower-resolution sensors spread along the entire finger, but these don't capture as much detail, so multiple regrasps are often required. Instead, the MIT team built a robotic finger with a rigid skeleton encased in a soft outer layer that has multiple high-resolution sensors incorporated under its transparent "skin."